Selective Kernel Res-Attention UNet: Deep Learning for Generating Decorrelation Mask With Applications to TanDEM-X Interferograms

Authors

Abstract

Decorrelation is one of the main limitations of InSAR. Masking decorrelated pixels is crucial for retrieving information from SAR interferograms. However, among traditional masking methods, manually drawing masks is time-consuming and may be unfeasible when decorrelation areas have complicated, blurred boundaries, while setting a single coherence threshold makes it difficult, if not impossible, to mask out all decorrelated pixels without losing valid phases. Here, we propose a deep-learning segmentation network (Mask Net) based on a Selective Kernel Res-Attention UNet for generating decorrelation masks, with applications to TanDEM-X interferograms. We conduct several experiments to determine the training strategy and parameters, including sample size, batch size, loss function, and down-sampling scheme, to optimize performance. Afterwards, we compare the performance of Mask Net with that of other classical networks. Our evaluation metrics show that Mask Net outperforms the best of these networks by 6.32% in IoU and 3.97% in F1 score, respectively. It also possesses the fastest inference speed, 0.4505 s for an image of 1024-by-1024 pixels, at least ~50% faster than the other networks. We apply Mask Net to three interferograms covering the Kilauea crater in Hawaii, the metropolitan region of Wuhan, and the Muztagata Glacier in China. The results show that, compared with the traditional masking methods, Mask Net can clearly mask out decorrelated regions while rarely causing loss of valid phases, and it exhibits better performance than the other networks, especially for regions with complex boundaries, with less computational time.
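As a rough illustration of the building block named in the title, the snippet below is a minimal PyTorch sketch of a selective-kernel (SK) convolution block of the kind the Mask Net encoder presumably uses, following the general SKNet design. The channel count, kernel choices, and reduction ratio are illustrative assumptions, not the authors' published configuration.

```python
import torch
import torch.nn as nn


class SelectiveKernelBlock(nn.Module):
    """Two parallel conv branches (3x3 and dilated 3x3, ~5x5 receptive field)
    fused by a learned channel-wise soft attention over the branches."""

    def __init__(self, channels: int, reduction: int = 8):
        super().__init__()
        self.branch3 = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.BatchNorm2d(channels), nn.ReLU(inplace=True))
        self.branch5 = nn.Sequential(  # dilated 3x3 emulates a larger kernel
            nn.Conv2d(channels, channels, 3, padding=2, dilation=2, bias=False),
            nn.BatchNorm2d(channels), nn.ReLU(inplace=True))
        hidden = max(channels // reduction, 8)
        self.squeeze = nn.Sequential(nn.Linear(channels, hidden), nn.ReLU(inplace=True))
        self.attend = nn.Linear(hidden, channels * 2)  # one weight vector per branch

    def forward(self, x):
        u3, u5 = self.branch3(x), self.branch5(x)
        fused = u3 + u5                               # fuse the two branches
        s = fused.mean(dim=(2, 3))                    # global average pooling -> (B, C)
        a = self.attend(self.squeeze(s))              # (B, 2*C)
        a = torch.softmax(a.view(-1, 2, u3.shape[1]), dim=1)  # choose between kernels
        return a[:, 0, :, None, None] * u3 + a[:, 1, :, None, None] * u5


if __name__ == "__main__":
    block = SelectiveKernelBlock(channels=64)
    feats = torch.randn(1, 64, 128, 128)   # e.g. an interferogram feature map
    print(block(feats).shape)              # torch.Size([1, 64, 128, 128])
```

The softmax over the two branches lets each channel adaptively weight the small and large receptive fields, which is the "selective kernel" idea the architecture name refers to.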


Similar resources

Kernel Methods for Deep Learning

We introduce a new family of positive-definite kernel functions that mimic the computation in large, multilayer neural nets. These kernel functions can be used in shallow architectures, such as support vector machines (SVMs), or in deep kernel-based architectures that we call multilayer kernel machines (MKMs). We evaluate SVMs and MKMs with these kernel functions on problems designed to illustr...
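To make the link between kernels and multilayer nets concrete, here is a small sketch, with assumed toy data and hyperparameters, that plugs a degree-1 arc-cosine kernel (the family of net-mimicking kernels the abstract describes) into scikit-learn's SVC as a callable kernel.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split


def arccos_kernel_deg1(X, Y):
    """Degree-1 arc-cosine kernel: (1/pi)*|x||y|*(sin t + (pi - t) cos t)."""
    nx = np.linalg.norm(X, axis=1)[:, None]
    ny = np.linalg.norm(Y, axis=1)[None, :]
    cos_t = np.clip(X @ Y.T / (nx * ny + 1e-12), -1.0, 1.0)
    t = np.arccos(cos_t)
    return (nx * ny / np.pi) * (np.sin(t) + (np.pi - t) * cos_t)


X, y = make_moons(n_samples=400, noise=0.2, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

svm = SVC(kernel=arccos_kernel_deg1, C=1.0).fit(Xtr, ytr)
print("test accuracy:", svm.score(Xte, yte))
```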


Deep Kernel Learning

We introduce scalable deep kernels, which combine the structural properties of deep learning architectures with the nonparametric flexibility of kernel methods. Specifically, we transform the inputs of a spectral mixture base kernel with a deep architecture, using local kernel interpolation, inducing points, and structure exploiting (Kronecker and Toeplitz) algebra for a scalable kernel represe...
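A heavily simplified sketch of the deep-kernel idea follows: warp the inputs through a small neural network, apply a standard base kernel in feature space, and fit everything by maximizing the GP marginal likelihood. The scalability machinery the abstract mentions (spectral mixture base kernel, local kernel interpolation, inducing points, Kronecker/Toeplitz algebra) is omitted; an RBF base kernel and exact GP regression are substituted purely for illustration.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)


class DeepKernel(nn.Module):
    """RBF base kernel applied to features produced by a small neural network."""

    def __init__(self, in_dim: int = 1, feat_dim: int = 4):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, 32), nn.Tanh(),
                                 nn.Linear(32, feat_dim))
        self.log_lengthscale = nn.Parameter(torch.zeros(()))

    def forward(self, X, Y):
        fx, fy = self.net(X), self.net(Y)
        d2 = torch.cdist(fx, fy).pow(2)
        return torch.exp(-0.5 * d2 / self.log_lengthscale.exp() ** 2)


# Toy 1-D regression: learn net weights and lengthscale by maximising the
# exact GP marginal likelihood (up to constants).
X = torch.linspace(-3, 3, 60).unsqueeze(-1)
y = torch.sin(2 * X).squeeze() + 0.1 * torch.randn(60)
noise = 1e-2

kernel = DeepKernel()
opt = torch.optim.Adam(kernel.parameters(), lr=1e-2)
for _ in range(200):
    K = kernel(X, X) + noise * torch.eye(60)
    L = torch.linalg.cholesky(K)
    alpha = torch.cholesky_solve(y.unsqueeze(-1), L)
    nll = 0.5 * (y.unsqueeze(-1) * alpha).sum() + torch.log(torch.diag(L)).sum()
    opt.zero_grad()
    nll.backward()
    opt.step()

# Posterior mean at a few query points.
Xq = torch.linspace(-3, 3, 5).unsqueeze(-1)
with torch.no_grad():
    K = kernel(X, X) + noise * torch.eye(60)
    mean = kernel(Xq, X) @ torch.linalg.solve(K, y.unsqueeze(-1))
print(mean.squeeze(-1))
```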


Atmospheric Error, Phase Trend and Decorrelation Noise in Terrasar-x Differential Interferograms

The potential of TerraSAR-X data for deformation measurements benefits from high spatial and temporal resolution and even allows detecting higher deformation gradients compared to other sensors. However, decorrelation noise from vegetation and atmospheric effects are of greater importance because of the shorter wavelength. The extent of a TerraSAR-X scene is small compared to common scale atmosp...


To understand deep learning we need to understand kernel learning

Generalization performance of classifiers in deep learning has recently become a subject of intense study. Deep models, which are typically heavily over-parametrized, tend to fit the training data exactly. Despite this overfitting, they perform well on test data, a phenomenon not yet fully understood. The first point of our paper is that strong performance of overfitted classifiers is not a uni...
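A toy demonstration of that point (an illustrative setup, not the paper's experiments): a kernel machine driven to near-interpolation of noisy training labels can still generalize reasonably on held-out data.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=600, n_features=20, flip_y=0.05, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

# Near-zero regularisation: the Laplacian-kernel machine fits the (noisy)
# training labels almost exactly, i.e. it overfits in the classical sense.
model = KernelRidge(alpha=1e-8, kernel="laplacian", gamma=0.05).fit(Xtr, ytr)
train_acc = ((model.predict(Xtr) > 0.5) == ytr).mean()
test_acc = ((model.predict(Xte) > 0.5) == yte).mean()
print(f"train acc: {train_acc:.3f}  test acc: {test_acc:.3f}")
```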


Generating Text with Deep Reinforcement Learning

We introduce a novel schema for sequence to sequence learning with a Deep QNetwork (DQN), which decodes the output sequence iteratively. The aim here is to enable the decoder to first tackle easier portions of the sequences, and then turn to cope with difficult parts. Specifically, in each iteration, an encoder-decoder Long Short-Term Memory (LSTM) network is employed to, from the input sequenc...
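A rough sketch under heavy assumptions: an encoder-decoder LSTM whose decoder emits Q-values over the vocabulary, so a DQN-style agent can pick and later revise output tokens. The iterative easy-to-hard decoding schedule and the replay/target-network machinery from the paper are omitted, and all sizes are placeholders.

```python
import torch
import torch.nn as nn


class Seq2SeqQNet(nn.Module):
    def __init__(self, vocab_size: int = 100, emb: int = 32, hidden: int = 64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb)
        self.encoder = nn.LSTM(emb, hidden, batch_first=True)
        self.decoder = nn.LSTM(emb, hidden, batch_first=True)
        self.q_head = nn.Linear(hidden, vocab_size)  # Q(s, a) for each token a

    def forward(self, src_tokens, draft_tokens):
        _, state = self.encoder(self.embed(src_tokens))          # summarize input
        dec_out, _ = self.decoder(self.embed(draft_tokens), state)
        return self.q_head(dec_out)                               # (B, T, vocab)


net = Seq2SeqQNet()
src = torch.randint(0, 100, (2, 12))     # a batch of input sequences
draft = torch.randint(0, 100, (2, 12))   # current draft of the output sequence
q_values = net(src, draft)
# Greedy action per position; in the paper the agent revises the draft iteratively.
next_draft = q_values.argmax(dim=-1)
print(next_draft.shape)                  # torch.Size([2, 12])
```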



Journal

Journal title: IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing

Year: 2021

ISSN: 2151-1535, 1939-1404

DOI: https://doi.org/10.1109/jstars.2021.3105703